Facebook, Show Us the Mess
This article is part of the On Tech newsletter. Here is a collection of past columns.
A pile of internal communications has given us a rare, unvarnished look into Facebook’s self-examinations and deliberations over how people are influenced by the company’s product designs and decisions.
Perhaps the public and Facebook would benefit if these glimpses weren’t so rare. Facebook and other internet powers could help us understand the world by showing us a little more of the messy reality of running virtual hangouts for billions of humans.
Something that has pleasantly surprised me from the reporting on the documents collected by Frances Haugen, the former Facebook product manager, is how much thought and care Facebook employees seemed to have devoted to assessing the company’s apps and the ways they shape what people do and how communities and societies behave. Facebook, show us this side of yourself.
Casey Newton, a technology writer, made this case last month: “What if Facebook routinely published its findings and allowed its data to be audited? What if the company made it dramatically easier for qualified researchers to study the platform independently?”
And what if other companies in technology did the same?
Imagine if Facebook had explained out loud the ways that it wrestled with restricting posts with false information about fraud after the 2020 U.S. presidential election and whether that risked silencing legitimate political discussions.
What if Facebook had shared with the public its private assessments of the ways that features to easily share lots of posts amplified hateful or bullying posts?
Imagine if Facebook employees involved in major product design changes could — like the U.S. Supreme Court justices — write dissenting opinions explaining their disagreements to the public.
I know that some, or all, of that sounds like a fantasy. Organizations have legitimate reasons to keep secrets, including to protect their employees and customers.
But Facebook is not an ordinary organization. It’s among a tiny number of corporations whose products help shape how humans behave and what we believe.
Learning more about what Facebook knows about the world would help improve our understanding of one another, and of Facebook. It would give outsiders an opportunity to validate, challenge and add to Facebook’s self-assessments. And it might make the company a little more trustworthy and better understood.
Facebook has said that it believed the reporting about its internal communications lacked nuance and context. Its reaction has included clamping down on internal deliberations to minimize leaks. And in my conversations with people in technology this week, there is a fear that Facebook, YouTube, Twitter and others will respond to weeks of tough reporting on Facebook by probing less into the effects of their products, or by keeping what they learn under lock and key.
But there is another path: being more open and revealing far more. That wouldn’t be entirely out of character for Facebook.
In 2015, the company publicly released and discussed research by its data scientists that found that the social network didn’t worsen the problem of “filter bubbles,” in which people see only information that confirms their beliefs. In 2018, Mark Zuckerberg published a lengthy post detailing the company’s examination of how people on Facebook responded to material that was salacious or offensive. The same year, Facebook disclosed an ambitious plan to share huge amounts of posts and other user data with outside researchers to study harmful information.
These efforts were far from perfect. Notably, the independent research consortium was dogged by botched data and disputes over preserving people’s privacy. But the efforts show that Facebook at times has wanted to be more open.
Nathaniel Persily, a Stanford Law School professor who was previously co-chair of the research consortium, recently drafted text for legislation that could grant independent researchers access to information about internet companies’ inner workings.
He told me that he thought of the research consortium as “road kill on the highway to something glorious”: a mix of voluntary and forced transparency from large internet companies. He praised Twitter, which last week released an analysis of the ways its computer systems in some cases amplified views on the political right more than those on the left.
Twitter’s research was incomplete. The company said it didn’t know why some messages circulated more than others. But Twitter was honest about what it knew and didn’t, and gave the public and researchers opportunities for further investigation. It showed us the mess.
Understand the Facebook Papers
A tech giant in trouble. The leak of internal documents by a former Facebook employee has provided an intimate look at the operations of the secretive social media company and renewed calls for better regulations of the company’s wide reach into the lives of its users.
How it began. In September, The Wall Street Journal published The Facebook Files, a series of reports based on leaked Facebook documents. The series exposed evidence that Facebook knew Instagram, one of its products, was worsening body-image issues among teenagers.
The whistle-blower. During an interview with “60 Minutes” that aired Oct. 3, Frances Haugen, a Facebook product manager who left the company in May, revealed that she was responsible for the leak of those internal documents.
Ms. Haugen’s testimony in Congress. On Oct. 5, Ms. Haugen testified before a Senate subcommittee, saying that Facebook was willing to use hateful and harmful content on its site to keep users coming back. Facebook executives, including Mark Zuckerberg, called her accusations untrue.
The Facebook Papers. Ms. Haugen also filed a complaint with the Securities and Exchange Commission and provided the documents to Congress in redacted form. A congressional staff member then supplied the documents, known as the Facebook Papers, to several news organizations, including The New York Times.
New revelations. Documents from the Facebook Papers show the degree to which Facebook knew of extremist groups on its site trying to polarize American voters before the election. They also reveal that internal researchers had repeatedly determined how Facebook’s key features amplified toxic content on the platform.
More about Facebook from New York Times Opinion:
Farhad Manjoo: Misguided congressional proposals intended to fix Facebook are worse than no legislation at all.
Greg Bensinger: “Facebook has demonstrated it won’t address its systemic problems until forced to do so. Now, it appears, only advertisers can make the status quo unprofitable and unsustainable.”
Kara Swisher: Mark Zuckerberg is no longer the adored leader and cultural touchstone at Facebook.
Before we go …
Giant tech companies are still great at money: Google and Microsoft made $$$$. Twitter is doing fine, too.
Would you upload your passport to watch YouTube? My colleague David McCabe reports that more companies and countries are opting for digital age checks to try to keep young children out of everything from video games to online pornography. But it’s tricky to balance the benefits of anonymity online while keeping kids safe.
Amazon is taking a stab at talk radio, sort of: The Verge writes that Amazon is building a new app that would let anyone create a live audio show and let listeners chime in with their voices. Is this clever or weird, or both?
Hugs to this
This is a Twitter thread of cows and beans that resemble them. For real. (I saw this first in the Garbage Day newsletter.)
We want to hear from you. Tell us what you think of this newsletter and what else you’d like us to explore. You can reach us at [email protected].
If you don’t already get this newsletter in your inbox, please sign up here. You can also read past On Tech columns.